Neural Dynamic Focused Topic Model
Authors
Abstract
Topic models and all their variants analyse text by learning meaningful representations through word co-occurrences. As pointed out in previous work, such models implicitly assume that the probability of a topic being active and its proportion within each document are positively correlated. This correlation can be strongly detrimental in the case of documents created over time, simply because recent documents are likely better described by new and hence rare topics. In this work we leverage recent advances in neural variational inference and present an alternative approach to the dynamic Focused Topic Model. Indeed, we develop a model for topic evolution which exploits sequences of Bernoulli random variables in order to track the appearances of topics, thereby decoupling their activities from their proportions. We evaluate our model on three different datasets (the UN general debates, a collection of NeurIPS papers, and the ACL Anthology dataset) and show that it (i) outperforms state-of-the-art topic models in generalization tasks and (ii) performs comparably to them in prediction tasks, while employing roughly the same number of parameters and converging about two times faster.
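The decoupling mechanism described in the abstract, per-time-step Bernoulli variables that switch topics on and off independently of how large their proportions are, can be illustrated with a small generative toy. The sketch below is not the authors' neural model: the random-walk activation dynamics, the Dirichlet proportions over active topics, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

K, V, T = 10, 500, 5          # topics, vocabulary size, time steps (hypothetical)
beta = rng.dirichlet(np.full(V, 0.1), size=K)   # topic-word distributions

# Topic activity: a sequence of Bernoulli variables active[t, k] tracks whether
# topic k is "on" at time t.  Here the activation probabilities follow a simple
# random walk in logit space; the paper instead learns such dynamics with
# neural variational inference.
logits = np.zeros(K)
active = np.zeros((T, K), dtype=bool)
for t in range(T):
    logits = logits + rng.normal(scale=0.5, size=K)
    active[t] = rng.random(K) < 1.0 / (1.0 + np.exp(-logits))

def sample_document(t, n_words=50, alpha=1.0):
    """Sample a bag of words for a document at time step t.

    Proportions are drawn only over the topics active at t, so whether a
    topic appears is decoupled from how large its proportion can be.
    """
    on = np.flatnonzero(active[t])
    if on.size == 0:                      # degenerate case: force one topic on
        on = np.array([rng.integers(K)])
    theta = rng.dirichlet(np.full(on.size, alpha))   # proportions over active topics
    topic_of_word = rng.choice(on, size=n_words, p=theta)
    return np.array([rng.choice(V, p=beta[k]) for k in topic_of_word])

doc = sample_document(t=2)
print(active[2].astype(int), doc[:10])
```

In this toy, a newly activated topic can immediately receive a large proportion in the documents that use it, which is the behaviour the abstract contrasts with standard topic models.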
Similar resources
A topic-focused trust model for Twitter
Twitter is a crucial platform to get access to breaking news and timely information. However, due to questionable provenance, uncontrollable broadcasting, and unstructured languages in tweets, Twitter is hardly a trustworthy source of breaking news. In this paper, we propose a novel topic-focused trust model to assess trustworthiness of users and tweets in Twitter. Unlike traditional graph-base...
Topic-focused Dynamic Information Filtering in Social Media
With the quick development of online social media such as Twitter or Sina Weibo in China, many users track hot topics to satisfy their information needs. For a hot topic, new opinions or ideas will be continuously produced in the form of an online data stream. In this scenario, how to effectively filter and display information for a certain topic dynamically, will be a critical prob...
A Neural Autoregressive Topic Model
We describe a new model for learning meaningful representations of text documents from an unlabeled collection of documents. This model is inspired by the recently proposed Replicated Softmax, an undirected graphical model of word counts that was shown to learn a better generative model and more meaningful document representations. Specifically, we take inspiration from the conditional mean-fie...
Topic Compositional Neural Language Model
We propose a Topic Compositional Neural Language Model (TCNLM), a novel method designed to simultaneously capture both the global semantic meaning and the local wordordering structure in a document. The TCNLM learns the global semantic coherence of a document via a neural topic model, and the probability of each learned latent topic is further used to build a Mixture-ofExperts (MoE) language mo...
Focused Topic Models
We present the focused topic model (FTM), a family of nonparametric Bayesian models for learning sparse topic mixture patterns. The FTM integrates desirable features from both the hierarchical Dirichlet process (HDP) and the Indian buffet process (IBP) – allowing an unbounded number of topics for the entire corpus, while each document maintains a sparse distribution over these topics. We observ...
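The sparse topic usage described in this snippet can be pictured with a finite beta-Bernoulli approximation of the Indian buffet process. The code below is an illustrative toy rather than the FTM itself; the concentration parameter gamma and the truncation level K are assumed values.

```python
import numpy as np

rng = np.random.default_rng(1)

K, D = 20, 8            # topic truncation level, number of documents (hypothetical)
gamma = 3.0             # IBP-style concentration (assumed value)

# Finite beta-Bernoulli approximation of the Indian buffet process:
# each topic k gets a corpus-wide appearance probability pi[k], and each
# document switches topics on or off independently with that probability.
pi = rng.beta(gamma / K, 1.0, size=K)
b = rng.random((D, K)) < pi            # sparse binary topic-usage matrix

# Each document's topic proportions live only on its active topics,
# so a rare topic can still dominate the few documents that use it.
theta = np.zeros((D, K))
for d in range(D):
    on = np.flatnonzero(b[d])
    if on.size:
        theta[d, on] = rng.dirichlet(np.ones(on.size))

print(b.astype(int))
print(theta.round(2))
```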
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i11.26496